Bayesian Classification With Gaussian Processes
Authors
Abstract
We consider the problem of assigning an input vector x to one of m classes by predicting P(c|x) for c = 1, ..., m. For a two-class problem, the probability of class one given x is estimated by σ(y(x)), where σ(y) = 1/(1 + exp(-y)). A Gaussian process prior is placed on y(x), and is combined with the training data to obtain predictions for new x points. We provide a Bayesian treatment, integrating over uncertainty in y and in the parameters that control the Gaussian process prior; the necessary integration over y is carried out using Laplace's approximation. The method is generalized to multiclass problems (m > 2) using the softmax function. We demonstrate the effectiveness of the method on a number of datasets.
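As a rough sketch of the two-class case, the Python fragment below finds the mode of the latent function y by Newton iteration under a logistic likelihood and a Gaussian process prior (the core of a Laplace-approximation treatment), then converts the approximate Gaussian posterior over y(x*) into a class probability. The squared-exponential covariance, its fixed hyperparameters, the jitter term, and the final kappa-style averaging of the sigmoid are illustrative assumptions, not the paper's exact procedure; in particular, the paper also integrates over the parameters of the Gaussian process prior, which this sketch omits.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance; hyperparameter values here are arbitrary."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_mode(K, t, n_iter=30):
    """Newton iteration for the mode of p(y | t) with GP prior covariance K
    and logistic likelihood; targets t are 0/1."""
    n = len(t)
    y = np.zeros(n)
    for _ in range(n_iter):
        pi = sigmoid(y)
        W = pi * (1.0 - pi)                     # negative Hessian of log p(t | y)
        sw = np.sqrt(W)
        B = np.eye(n) + sw[:, None] * K * sw[None, :]
        L = np.linalg.cholesky(B)
        b = W * y + (t - pi)
        a = b - sw * np.linalg.solve(L.T, np.linalg.solve(L, sw * (K @ b)))
        y = K @ a                               # new estimate of the mode
    return y

def predict(X_train, t, X_test):
    """Approximate P(class 1 | x*) at the test inputs."""
    K = rbf_kernel(X_train, X_train) + 1e-8 * np.eye(len(t))   # jitter for stability
    Ks = rbf_kernel(X_train, X_test)
    kss = np.full(len(X_test), 1.0)             # prior variance at test points (kernel variance)
    y_hat = laplace_mode(K, t)
    pi = sigmoid(y_hat)
    W = pi * (1.0 - pi)
    sw = np.sqrt(W)
    L = np.linalg.cholesky(np.eye(len(t)) + sw[:, None] * K * sw[None, :])
    mean = Ks.T @ (t - pi)                      # approximate posterior mean of y(x*)
    v = np.linalg.solve(L, sw[:, None] * Ks)
    var = kss - (v ** 2).sum(axis=0)            # approximate posterior variance of y(x*)
    kappa = 1.0 / np.sqrt(1.0 + np.pi * var / 8.0)
    return sigmoid(kappa * mean)                # average the sigmoid over the Gaussian (kappa approximation)

# Toy usage on synthetic one-dimensional data.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 1))
t = (X[:, 0] > 0).astype(float)
print(predict(X, t, np.array([[-1.0], [1.0]])))
```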
Similar Articles
Variational Gaussian process classifiers
Gaussian processes are a promising nonlinear regression tool, but it is not straightforward to solve classification problems with them. In this paper the variational methods of Jaakkola and Jordan are applied to Gaussian processes to produce an efficient Bayesian binary classifier.
A latent variable Gaussian process model with Pitman-Yor process priors for multiclass classification
Mixtures of Gaussian processes have been considered by several researchers as a means of dealing with non-stationary covariance functions, discontinuities, multi-modality, and overlapping output signals in the context of regression tasks. In this paper, for the first time in the literature, we devise a Gaussian process mixture model especially suitable for multiclass classification applications...
Integrating Gaussian Processes with Word-Sequence Kernels for Bayesian Text Categorization
We address the problem of multi-labelled text classification using word-sequence kernels. However, rather than applying them with Support Vector Machines as in previous work, we choose a classifier based on Gaussian Processes. This is a probabilistic non-parametric method that retains a sound probabilistic semantics while overcoming the limitations of parametric methods. We present the empirical ...
Probabilistic Programming with Gaussian Process Memoization
Gaussian Processes (GPs) are widely used tools in statistics, machine learning, robotics, computer vision, and scientific computation. However, despite their popularity, they can be difficult to apply; all but the simplest classification or regression applications require specification and inference over complex covariance functions that do not admit simple analytical posteriors. This paper sho...
Nonparametric Bayesian Density Modeling with Gaussian Processes
The Gaussian process is a useful prior on functions for Bayesian kernel regression and classification. Density estimation with a Gaussian process prior is difficult, however, as densities must be nonnegative and integrate to unity. The statistics community has explored the use of a logistic Gaussian process for density estimation, relying on approximations of the normalization constant (e.g. [1...
Bayesian Warped Gaussian Processes
Warped Gaussian processes (WGP) [1] model output observations in regression tasks as a parametric nonlinear transformation of a Gaussian process (GP). The use of this nonlinear transformation, which is included as part of the probabilistic model, was shown to enhance performance by providing a better prior model on several data sets. In order to learn its parameters, maximum likelihood was used...
Journal: IEEE Trans. Pattern Anal. Mach. Intell.
Volume 20, Issue -
Pages -
Publication date: 1998